Computational Analogues of Entropy
Abstract
Min-entropy is a statistical measure of the amount of randomness contained in a distribution. In this paper we investigate computational min-entropy, the computational analogue of statistical min-entropy. We consider three possible definitions of this notion and prove equivalence and separation results for them in various computational models. We also study whether certain properties of statistical min-entropy have computational analogues. In particular, we consider the following questions:

1. Let X be a distribution with high computational min-entropy. Does applying a "randomness extractor" to X yield a pseudorandom distribution?
2. Let X and Y be (possibly dependent) random variables. Is the computational min-entropy of (X, Y) at least as large as the computational min-entropy of X?
3. Let X be a distribution over {0,1}^n that is "weakly unpredictable," in the sense that it is hard to predict a constant fraction of the coordinates of X with a constant bias. Does X have computational min-entropy Ω(n)?

We show that the answers to these questions depend on the computational model considered: in some natural models the answer is negative, while in others it is positive. Our positive results for the third question exhibit models in which the "hybrid argument bottleneck" of moving from a distinguisher to a predictor can be avoided.
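For reference, the standard definition of statistical min-entropy, together with one common way of formalizing a computational analogue (a HILL-style definition; the parameters ε and s below are illustrative notation, and this is only one of the definitional variants the paper compares), can be written as:

```latex
% Statistical min-entropy of a distribution X over {0,1}^n:
\[
  H_\infty(X) \;=\; \min_{x \in \operatorname{supp}(X)} \log \frac{1}{\Pr[X = x]}
  \;=\; -\log \max_x \Pr[X = x].
\]
% One common computational analogue (HILL-style): X has computational
% min-entropy at least k (against circuits of size s, with advantage eps)
% if it is indistinguishable from some Y whose statistical min-entropy is k.
\[
  H^{\mathrm{HILL}}_{\varepsilon,s}(X) \,\ge\, k
  \;\iff\;
  \exists\, Y \text{ with } H_\infty(Y) \ge k \text{ such that }
  \forall\, C \in \mathrm{SIZE}(s):\;
  \bigl|\Pr[C(X)=1] - \Pr[C(Y)=1]\bigr| \le \varepsilon.
\]
```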
Similar Resources
A Computational Study on the Configurational Behaviors of Dihalodiazenes and their Analogues Containing P and As Atoms
In this research, we report the results of DFT calculations using the hybrid exchange-correlation (xc) functional B3LYP, and we employ NBO analysis to investigate stereoelectronic effects. Electrostatic and steric effects on the conformational properties of 1,2-difluorodiazene (1), 1,2-dichlorodiazene (2) and 1,2-dibromodiazene (3) are also studied. Factors determining the thermodynamically stable molecular struct...
Characterization of several kinds of quantum analogues of relative entropy
Quantum relative entropy, D(ρ‖σ) := Tr ρ(log ρ − log σ), plays an important role in quantum information and related fields. However, there are many quantum analogues of relative entropy. In this paper, we characterize these analogues from an information-geometrical viewpoint. We also consider the naturalness of quantum relative entropy among these an...
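As a small illustration of the formula above (our sketch, not taken from the cited paper), the following computes D(ρ‖σ) numerically for two density matrices via the matrix logarithm. It assumes ρ and σ are full-rank density matrices and uses natural logarithms; the example states are purely illustrative.

```python
import numpy as np
from scipy.linalg import logm

def quantum_relative_entropy(rho: np.ndarray, sigma: np.ndarray) -> float:
    """D(rho || sigma) = Tr[ rho (log rho - log sigma) ], in nats."""
    # Assumes rho and sigma are positive-definite density matrices (trace 1);
    # for a rank-deficient sigma the quantity can be infinite.
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))

# Illustrative 2x2 (qubit) density matrices.
rho = np.array([[0.75, 0.0], [0.0, 0.25]])
sigma = np.array([[0.5, 0.0], [0.0, 0.5]])
print(quantum_relative_entropy(rho, sigma))  # ~0.131 nats
```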
Estimation of the Entropy Rate of Ergodic Markov Chains
In this paper an approximation of the entropy rate of an ergodic Markov chain is calculated via sample-path simulation. Although an explicit formula for the entropy rate exists, the exact computational method is laborious to apply. It is demonstrated that the entropy rate estimated from a sample path not only converges to the correct entropy rate but also does so exponential...
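To make the sample-path idea concrete, here is a minimal sketch (our illustration, not necessarily the estimator analyzed in the cited paper): it simulates one path of the chain and estimates the entropy rate as -(1/n) log of the path probability, which for an ergodic chain converges to the exact value -Σ_i π_i Σ_j P_ij log P_ij.

```python
import numpy as np

def exact_entropy_rate(P: np.ndarray) -> float:
    """Exact entropy rate h = -sum_i pi_i sum_j P_ij log P_ij (nats)."""
    # Stationary distribution: left eigenvector of P for eigenvalue 1.
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    pi = pi / pi.sum()
    logP = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
    return float(-np.sum(pi[:, None] * P * logP))

def sample_path_estimate(P: np.ndarray, n: int, seed=None) -> float:
    """Estimate h from one simulated path as -(1/n) log P(path)."""
    rng = np.random.default_rng(seed)
    k = P.shape[0]
    x = rng.integers(k)
    log_prob = 0.0
    for _ in range(n):
        y = rng.choice(k, p=P[x])
        log_prob += np.log(P[x, y])
        x = y
    return -log_prob / n

P = np.array([[0.9, 0.1], [0.2, 0.8]])  # illustrative 2-state chain
print(exact_entropy_rate(P), sample_path_estimate(P, 100_000, seed=0))
```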
Thermodynamical analogues in quantum information theory
The first step in quantum information theory is the identification of entanglement as a valuable resource. The next step is learning how to exploit this resource efficiently. We learn how to exploit entanglement efficiently by applying analogues of thermodynamical concepts. These concepts include reversibility, entropy, and the distinction between intensive and extensive quantities. We discuss ...
Permutation Complexity and Coupling Measures in Hidden Markov Models
In [Haruna, T. and Nakajima, K., 2011. Physica D 240, 1370–1377], the authors introduced the duality between values (words) and orderings (permutations) as a basis to discuss the relationship between information-theoretic measures for finite-alphabet stationary stochastic processes and their permutation analogues. It has been used to give a simple proof of the equality between the entropy rate a...
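As a rough illustration of the value-to-ordering ("permutation") viewpoint mentioned above, the following sketch (ours, using the standard ordinal-pattern entropy rather than anything specific to the cited paper) estimates the Shannon entropy of the ordinal patterns occurring in a sequence:

```python
import numpy as np
from collections import Counter
from math import log

def permutation_entropy(x, order: int = 3) -> float:
    """Shannon entropy (nats) of ordinal patterns of length `order` in x."""
    # Map each window of length `order` to its ordering (argsort pattern)
    # and compute the entropy of the empirical pattern distribution.
    patterns = Counter(
        tuple(np.argsort(x[i:i + order])) for i in range(len(x) - order + 1)
    )
    total = sum(patterns.values())
    return -sum((c / total) * log(c / total) for c in patterns.values())

rng = np.random.default_rng(0)
# For i.i.d. continuous noise all 3! patterns are equally likely, so the
# result is close to log(6) ~ 1.79 nats.
print(permutation_entropy(rng.standard_normal(10_000), order=3))
```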
Journal:
Volume / Issue:
Pages: -
Publication date: 2003